Self-Scaling Parallel Quasi-Newton Methods
Authors
Abstract
In this paper, a new class of self-scaling quasi-Newton (SSQN) updates for solving unconstrained nonlinear optimization problems (UNOPs) is proposed. It is shown that many existing QN updates can be considered as special cases of the new family. Parallel SSQN algorithms based on this class of updates are studied. In comparison to standard serial QN methods, the proposed parallel SSQN (SSPQN) algorithms show significant improvement in the total number of iterations and function/gradient evaluations required to solve a wide range of test problems. In fact, the average speedup factors of the new SSPQN algorithms over the conventional BFGS method and the E04DGE routine in the NAG library are 3.22/3.13 and 2.80/3.09, respectively (in terms of total number of iterations / total number of function/gradient evaluations required). For some test problems, the speedup factor gained by the new algorithms can be as high as 25/25 over BFGS and 20/25 over E04DGE, again in terms of total number of iterations and function/gradient evaluations.
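The abstract does not reproduce the update formula itself, but the classical self-scaling BFGS update that this family generalizes can be sketched in a few lines. Below is a minimal Python sketch assuming the Oren-Luenberger scaling factor tau_k = (s_k'y_k)/(y_k'H_k y_k) applied to the inverse Hessian approximation before an ordinary BFGS update; the function name and the skip rule for non-positive curvature are illustrative choices, not taken from the paper.

```python
import numpy as np

def ss_bfgs_update(H, s, y):
    """One self-scaling BFGS update of the inverse Hessian approximation H.

    A sketch of the classical Oren-Luenberger scaling: H is multiplied by
    tau = (s'y) / (y'Hy) before the ordinary BFGS update, which keeps the
    spectrum of the approximation well-conditioned across iterations.
    Here s = x_{k+1} - x_k and y = grad_{k+1} - grad_k.
    """
    sy = float(s @ y)
    if sy <= 1e-12:                  # curvature condition violated: skip update
        return H
    Hy = H @ y
    tau = sy / float(y @ Hy)         # self-scaling factor (the paper's family
                                     # generalizes this particular choice)
    rho = 1.0 / sy
    V = np.eye(len(s)) - rho * np.outer(s, y)
    return V @ (tau * H) @ V.T + rho * np.outer(s, s)
```

In a full method, H would then be applied to the current gradient to produce the search direction; the scaling is what distinguishes an SSQN step from a plain BFGS step.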
Similar resources
Wide interval for efficient self-scaling quasi-Newton algorithms
This paper uses certain conditions for the global and superlinear convergence of the two-parameter self-scaling Broyden family of quasi-Newton algorithms for unconstrained optimization to derive a wide interval for self-scaling updates. Numerical testing shows that such algorithms not only accelerate the convergence of the (unscaled) methods from the so-called convex class, but increase their c...
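For context, the two-parameter self-scaling Broyden family referred to here is commonly written for the inverse Hessian approximation in the following standard form (a textbook statement; the particular parameter interval derived in the paper is not reproduced in this truncated snippet):

$$
H_{k+1} = \tau_k\left(H_k - \frac{H_k y_k y_k^{\top} H_k}{y_k^{\top} H_k y_k} + \phi_k\, v_k v_k^{\top}\right) + \frac{s_k s_k^{\top}}{s_k^{\top} y_k},
\qquad
v_k = \left(y_k^{\top} H_k y_k\right)^{1/2}\left(\frac{s_k}{s_k^{\top} y_k} - \frac{H_k y_k}{y_k^{\top} H_k y_k}\right),
$$

where $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$; $\tau_k > 0$ is the scaling parameter and $\phi_k$ the update parameter, with $\tau_k = 1$, $\phi_k = 1$ recovering the plain BFGS inverse update.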
A Self-Correcting Variable-Metric Algorithm for Stochastic Optimization
An algorithm for stochastic (convex or nonconvex) optimization is presented. The algorithm is variable-metric in the sense that, in each iteration, the step is computed through the product of a symmetric positive definite scaling matrix and a stochastic (mini-batch) gradient of the objective function, where the sequence of scaling matrices is updated dynamically by the algorithm. A key feature ...
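To make the description concrete, here is a minimal Python sketch of a step of that shape: the iterate moves along the product of a symmetric positive definite (here diagonal) scaling matrix and a mini-batch gradient. The RMSProp-style maintenance of the scaling diagonal is an illustrative stand-in; the paper's self-correcting update rule for the scaling matrices is different and not reproduced in this truncated snippet.

```python
import numpy as np

def variable_metric_step(x, grad_fn, v, lr=0.1, beta=0.9, eps=1e-8):
    """One stochastic variable-metric step: x <- x - lr * D @ g.

    D = diag(1 / (sqrt(v) + eps)) is symmetric positive definite by
    construction; v is a running second-moment estimate of the mini-batch
    gradients (an RMSProp-style stand-in for the paper's dynamically
    updated scaling matrices).
    """
    g = grad_fn(x)                       # stochastic (mini-batch) gradient
    v = beta * v + (1.0 - beta) * g * g  # update the curvature proxy
    d = 1.0 / (np.sqrt(v) + eps)         # diagonal of the SPD scaling matrix
    return x - lr * d * g, v
```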
A combined class of self-scaling and modified quasi-Newton methods
Techniques for obtaining safely positive definite Hessian approximations with self-scaling and modified quasi-Newton updates are combined to obtain 'better' curvature approximations in line search methods for unconstrained optimization. It is shown that this class of methods, like the BFGS method, has global and superlinear convergence for convex functions. Numerical experiments with this class, ...
On a bilinear optimization problem in parallel magnetic resonance imaging
This work is concerned with the structure of bilinear minimization problems arising in recovering subsampled and modulated images in parallel magnetic resonance imaging. By considering a physically reasonable simplified model exhibiting the same fundamental mathematical difficulties, it is shown that such problems suffer from poor gradient scaling and non-convexity, which causes standard optimi...
Self-Scaling Variable Metric Algorithms Without Line Search
This paper introduces a new class of quasi-Newton algorithms for unconstrained minimization in which no line search is necessary and the inverse Hessian approximations are positive definite. These algorithms are based on a two-parameter family of rank two, updating formulae used earlier with line search in self-scaling variable metric algorithms. It is proved that, in a quadratic case, the new ...
Journal title:
Volume/Issue:
Pages: -
Publication date: 1998